
    Exact and efficient solutions of the LMC Multitask Gaussian Process model

    The Linear Model of Co-regionalization (LMC) is a very general model of multitask Gaussian processes for regression or classification. While its expressivity and conceptual simplicity are appealing, naive implementations have cubic complexity in the number of datapoints and the number of tasks, making approximations mandatory for most applications. However, recent work has shown that under some conditions the latent processes of the model can be decoupled, leading to a complexity that is only linear in the number of these processes. We here extend these results, showing from the most general assumptions that the only condition necessary for an efficient exact computation of the LMC is a mild hypothesis on the noise model. We introduce a full parametrization of the resulting \emph{projected LMC} model and an expression of the marginal likelihood enabling efficient optimization. We perform a parametric study on synthetic data to show the excellent performance of our approach compared to an unrestricted exact LMC and to approximations of the latter. Overall, the projected LMC appears as a credible and simpler alternative to state-of-the-art models, and it greatly facilitates some computations such as leave-one-out cross-validation and fantasization.
    Comment: 21 pages, 5 figures, submitted to AISTATS
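The covariance structure underlying the LMC can be sketched in a few lines of NumPy: the prior over all (task, input) pairs is a sum of Kronecker products between task-coregionalization matrices and input kernels. The RBF kernel, the rank-1 mixing matrices, and all variable names below are illustrative assumptions, not the paper's parametrization.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0):
    # Squared-exponential kernel on a 1-D input grid (illustrative choice).
    d2 = (X[:, None] - X[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

def lmc_covariance(X, mixing, lengthscales):
    # LMC prior covariance:  K = sum_q  B_q (x) k_q(X, X),
    # with B_q = a_q a_q^T a rank-1 coregionalization matrix.
    n, n_tasks = len(X), mixing.shape[0]
    K = np.zeros((n_tasks * n, n_tasks * n))
    for a_q, ell_q in zip(mixing.T, lengthscales):
        B_q = np.outer(a_q, a_q)                  # task-covariance factor
        K += np.kron(B_q, rbf_kernel(X, ell_q))   # Kronecker mixes tasks and inputs
    return K

X = np.linspace(0.0, 1.0, 5)          # 5 inputs
A = np.array([[1.0, 0.3],             # 3 tasks, Q = 2 latent processes
              [0.5, 1.0],
              [0.2, 0.8]])
K = lmc_covariance(X, A, lengthscales=[0.2, 1.0])
print(K.shape)  # (15, 15)
```

The naive approach factorizes this full (tasks × datapoints) matrix, hence the cubic cost in both dimensions that motivates decoupling the latent processes.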

    Alzheimer's disease - input of vitamin D with mEmantine assay (AD-IDEA trial): study protocol for a randomized controlled trial

    BACKGROUND: Current treatments for Alzheimer's disease and related disorders (ADRD) are symptomatic and can only temporarily slow down ADRD. Future possibilities of care rely on multi-target drug therapies that simultaneously address several pathophysiological processes leading to neurodegeneration. We hypothesized that the combination of memantine with vitamin D could be neuroprotective in ADRD, thereby limiting neuronal loss and cognitive decline. The aim of this trial is to compare, after 24 weeks, the effect of oral vitamin D3 (cholecalciferol) with that of a placebo on the change in cognitive performance in patients suffering from moderate ADRD and receiving memantine. METHODS: The AD-IDEA Trial is a unicentre, double-blind, randomized, placebo-controlled, intent-to-treat, superiority trial. Patients aged 60 years and older presenting with moderate ADRD (i.e., Mini-Mental State Examination [MMSE] score between 10-20), hypovitaminosis D (i.e., serum 25-hydroxyvitamin D [25OHD] < 30 ng/mL), normocalcemia (i.e., serum calcium < 2.65 mmol/L) and receiving no antidementia treatment at time of inclusion are being recruited. All participants receive memantine 20 mg once daily (titrated in 5 mg increments over 4 weeks) and each one is randomized to one of the two treatment options: either cholecalciferol (one 100,000 IU drinking vial every 4 weeks) or placebo (administered at the same pace). One hundred and twenty participants are being recruited and treatment continues for 24 weeks. The primary outcome measure is the change in cognitive performance using the Alzheimer's Disease Assessment Scale-cognition score.
Secondary outcomes are changes in other cognitive scores (MMSE, Frontal Assessment Battery, Trail Making Test parts A and B), change in functional performance (Activities of Daily Living scale, and 4-item Instrumental Activities of Daily Living scale), posture and gait (Timed Up & Go, Five Time Sit-to-Stand, spatio-temporal analysis of walking), as well as the between-groups comparison of compliance with treatment and tolerance. These outcomes are assessed at baseline, 12 and 24 weeks, together with the serum concentrations of 25OHD, calcium and parathyroid hormone. DISCUSSION: The combination of memantine plus vitamin D may represent a new multi-target therapeutic class for the treatment of ADRD. The AD-IDEA Trial seeks to provide evidence on its efficacy in limiting cognitive and functional declines in ADRD. TRIAL REGISTRATION: ClinicalTrials.gov number, NCT01409694.

    An EIM-based compression-extrapolation tool for efficient treatment of homogenized cross-section data

    Nuclear reactor simulators implementing the widespread two-step deterministic calculation scheme tend to produce a large volume of intermediate data at the interface of their two subcodes (up to dozens or even hundreds of gigabytes), which can be so cumbersome that it hinders the global performance of the code. The vast majority of this data consists of "few-groups homogenized cross-sections", nuclear quantities stored in the form of tabulated multivariate functions which can be precomputed to a large extent. It has been noticed in Tomatis (2021) that few-groups homogenized cross-sections are highly redundant, that is, they exhibit strong correlations, which paves the way for the use of compression techniques. We here pursue this line of work by introducing a new coupled compression/surrogate-modeling tool based on the Empirical Interpolation Method (EIM), an algorithm originally developed in the framework of partial differential equations (Barrault et al., 2004). This EIM-compression method is based on the infinity norm ‖·‖∞ and proceeds in a greedy manner, iteratively trying to approximate the data and incorporating the chunks of information which cause the largest error. In the process, it generates a vector basis and a set of interpolation points, which provide an elementary surrogate model that can be used to approximate future data from little information. The algorithm is also well suited to parallelization and out-of-core computation (processing of data too large for the computer RAM), and is very easy to apprehend and implement. This method enables us both to efficiently compress cross-sections and to spare a large fraction of the required lattice calculations. We investigate its performance on large realistic nuclear data replicating the well-known VERA benchmark (Godfrey, 2014) (20 energy groups, pin-by-pin homogenization, 10 particularized isotopes). Compression loss, memory savings and speed are analyzed both from a data-centric point of view, in the perspective of applications in neutronics, and by comparison with an existing and widely used method, stochastic truncated SVD, to assess mathematical efficiency. We discuss the usage of our surrogate model and its sensitivity to the choice of the training set. The method is shown to be competitive in terms of accuracy and speed, to provide important memory savings, and to spare a large amount of physics-code computation; all this could facilitate the adoption of fine-grained modelization schemes (pin-by-pin and many-groups homogenization, particularization of many isotopes) in industrial setups. A GitHub repository is available which contains all the methods used for the article.
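The greedy loop described in the abstract (approximate the data, find the worst-approximated chunk in the sup norm, add it to the basis together with its "magic" interpolation point, repeat) can be sketched generically; the function names, stopping rule, and synthetic data below are illustrative assumptions, not the article's implementation.

```python
import numpy as np

def eim_compress(D, tol=1e-8, max_rank=50):
    # Greedy Empirical Interpolation of the columns of D (rows = table
    # entries, columns = snapshots). Returns a basis and the interpolation
    # row indices, both selected greedily in the sup norm.
    basis, points = [], []
    R = D.copy()
    for _ in range(max_rank):
        # Pick the snapshot with the largest remaining sup-norm error.
        j = np.argmax(np.abs(R).max(axis=0))
        r = R[:, j]
        if np.abs(r).max() < tol:
            break
        i = int(np.argmax(np.abs(r)))    # new interpolation ("magic") point
        basis.append(r / r[i])           # normalize so basis[k][points[k]] = 1
        points.append(i)
        # Deflate: interpolate every snapshot at the chosen points, subtract.
        B = np.column_stack(basis)
        R = D - B @ np.linalg.solve(B[points, :], D[points, :])
    return np.column_stack(basis), points

rng = np.random.default_rng(0)
# Synthetic low-rank data standing in for homogenized cross-section tables.
D = rng.standard_normal((200, 3)) @ rng.standard_normal((3, 40))
B, pts = eim_compress(D)
# Surrogate use: reconstruct full tables from the few interpolation rows only.
coef = np.linalg.solve(B[pts, :], D[pts, :])
print(len(pts), np.abs(D - B @ coef).max())  # few points, small residual
```

The deflation step is what makes the method greedy and incremental: each new basis vector is a residual, so it is exactly zero at all previously selected points and the interpolation system stays invertible.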

    COVID-19 outcomes in patients with inflammatory rheumatic and musculoskeletal diseases treated with rituximab: a cohort study

    Background: Various observations have suggested that the course of COVID-19 might be less favourable in patients with inflammatory rheumatic and musculoskeletal diseases receiving rituximab compared with those not receiving rituximab. We aimed to investigate whether treatment with rituximab is associated with severe COVID-19 outcomes in patients with inflammatory rheumatic and musculoskeletal diseases. Methods: In this cohort study, we analysed data from the French RMD COVID-19 cohort, which included patients aged 18 years or older with inflammatory rheumatic and musculoskeletal diseases and highly suspected or confirmed COVID-19. The primary endpoint was the severity of COVID-19 in patients treated with rituximab (rituximab group) compared with patients who did not receive rituximab (no rituximab group). Severe disease was defined as that requiring admission to an intensive care unit or leading to death. Secondary objectives were to analyse deaths and duration of hospital stay. The inverse probability of treatment weighting propensity score method was used to adjust for potential confounding factors (age, sex, arterial hypertension, diabetes, smoking status, body-mass index, interstitial lung disease, cardiovascular diseases, cancer, corticosteroid use, chronic renal failure, and the underlying disease [rheumatoid arthritis vs others]). Odds ratios and hazard ratios and their 95% CIs were calculated as effect size, by dividing the two population mean differences by their SD. This study is registered with ClinicalTrials.gov, NCT04353609. Findings: Between April 15, 2020, and Nov 20, 2020, data were collected for 1090 patients (mean age 55·2 years [SD 16·4]); 734 (67%) were female and 356 (33%) were male. Of the 1090 patients, 137 (13%) developed severe COVID-19 and 89 (8%) died.
After adjusting for potential confounding factors, severe disease was observed more frequently (effect size 3·26, 95% CI 1·66-6·40, p=0·0006) and the duration of hospital stay was markedly longer (0·62, 0·46-0·85, p=0·0024) in the 63 patients in the rituximab group than in the 1027 patients in the no rituximab group. 13 (21%) of 63 patients in the rituximab group died compared with 76 (7%) of 1027 patients in the no rituximab group, but the adjusted risk of death was not significantly increased in the rituximab group (effect size 1·32, 95% CI 0·55-3·19, p=0·53). Interpretation: Rituximab therapy is associated with more severe COVID-19. Rituximab should therefore be prescribed with particular caution in patients with inflammatory rheumatic and musculoskeletal diseases.
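The inverse-probability-of-treatment weighting used in the study's adjustment can be sketched in generic form: fit a propensity model of treatment on confounders, then weight each patient by the inverse of the probability of the treatment actually received so the confounders balance across groups. The synthetic covariates, the Newton-Raphson logistic fit, and all names below are illustrative assumptions, not the cohort's data or analysis code.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Hypothetical confounders (e.g. standardized age, comorbidity score).
X = rng.standard_normal((n, 2))
# Treatment assignment depends on the confounders (confounding by indication).
T = rng.random(n) < 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))

def propensity_scores(X, t, iters=25):
    # Newton-Raphson logistic regression of treatment on confounders.
    Z = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Z.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Z @ w))
        hess = Z.T @ (Z * (p * (1 - p))[:, None])
        w += np.linalg.solve(hess, Z.T @ (t - p))
    return 1.0 / (1.0 + np.exp(-Z @ w))

ps = propensity_scores(X, T.astype(float))
# Stabilized inverse-probability-of-treatment weights.
wts = np.where(T, T.mean() / ps, (1 - T.mean()) / (1 - ps))
# Balance check: weighted confounder means should nearly coincide.
m1 = np.average(X[T, 0], weights=wts[T])
m0 = np.average(X[~T, 0], weights=wts[~T])
print(m1 - m0)  # much closer to zero than the unweighted difference
```

Effect estimates (odds or hazard ratios) are then computed on this weighted pseudo-population, which is what allows a crude comparison of 63 versus 1027 patients to be interpreted after adjustment.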